Derivative-free Efficient Global Optimization on High-dimensional Simplex

Author

  • Priyam Das
Abstract

In this paper, we develop a novel derivative-free deterministic greedy algorithm for the global optimization of any objective function whose parameters belong to a unit simplex. The main principle of the proposed algorithm is to make jumps of varying step sizes within the simplex parameter space, searching greedily for the best direction to move. Unlike most existing methods for constrained optimization, the objective function is evaluated along independent directions within each iteration, so incorporating parallel computing makes the algorithm even faster. The required degree of parallelization grows only linearly with the dimension of the parameter space, which makes the algorithm well suited to high-dimensional optimization problems on the simplex using parallel computing. A comparative study of the performance of this algorithm and other existing algorithms is presented for several moderate- and high-dimensional optimization problems, along with some benchmark test functions transformed onto the simplex. A roughly 20- to 300-fold improvement in computation time over the genetic algorithm has been achieved with the proposed algorithm, together with more accurate solutions.
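The jump-and-search principle described in the abstract can be sketched in code. The following is a simplified illustration under our own assumptions (jumps toward simplex vertices with a multiplicatively shrinking step size), not the paper's exact algorithm; the function `greedy_simplex_search` and all its parameter names are hypothetical.

```python
import numpy as np

def greedy_simplex_search(f, dim, step=0.5, shrink=0.5, tol=1e-6, max_iter=10000):
    """Greedy derivative-free search on the unit simplex (sketch).

    At each iteration, tries shifting a fraction `step` of the mass
    toward each coordinate, keeps the best improving move, and shrinks
    the step size when no move improves the objective.
    """
    p = np.full(dim, 1.0 / dim)           # start at the simplex centre
    best = f(p)
    for _ in range(max_iter):
        # Candidate moves: q = (1 - step) * p + step * e_j stays on the
        # simplex. These dim evaluations are mutually independent, so
        # they could be run in parallel, as the abstract emphasises.
        candidates = []
        for j in range(dim):
            q = (1.0 - step) * p
            q[j] += step
            candidates.append(q)
        vals = [f(q) for q in candidates]
        k = int(np.argmin(vals))
        if vals[k] < best:                # greedy: accept the best move
            p, best = candidates[k], vals[k]
        else:
            step *= shrink                # no improvement: smaller jumps
            if step < tol:
                break
    return p, best

# Example: minimise a quadratic whose minimiser lies inside the simplex.
target = np.array([0.5, 0.3, 0.2])
p_opt, f_opt = greedy_simplex_search(lambda p: np.sum((p - target) ** 2), 3)
```

Because each candidate move is a convex combination of the current point and a vertex, every iterate remains a valid point on the unit simplex without any explicit projection step.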


Similar Resources

Augmented Downhill Simplex a Modified Heuristic Optimization Method

The Augmented Downhill Simplex Method (ADSM) is introduced here; it is a heuristic combination of the Downhill Simplex Method (DSM) with a random-search algorithm. DSM is an interpretable nonlinear local optimization method; however, being a local exploitation algorithm, it can be trapped in a local minimum. In contrast, random search is a global exploration strategy, but it is less efficient. Here, rand...


A Three-terms Conjugate Gradient Algorithm for Solving Large-Scale Systems of Nonlinear Equations

The nonlinear conjugate gradient method is well known for solving large-scale unconstrained optimization problems due to its low storage requirement and ease of implementation. Research on its application to higher-dimensional systems of nonlinear equations is just beginning. This paper presents a three-term conjugate gradient algorithm for solving large-scale systems of nonlinear e...


Model Selection for Support Vector Classifiers via Direct Simplex Search

This paper addresses the problem of tuning hyperparameters in support vector machine modeling. A Direct Simplex Search (DSS) method, which seeks to evolve hyperparameter values using an empirical error estimate as steering criterion, is proposed and experimentally evaluated on real-world datasets. DSS is a robust hill climbing scheme, a popular derivative-free optimization method, suitable for ...


An Improved Hybrid Genetic Algorithm with a New Local Search Procedure

A hybrid genetic algorithm (HGA) combines a genetic algorithm (GA) with an individual learning procedure. One such learning procedure is a local search (LS) technique used by the GA for refining global solutions. An HGA is also called a memetic algorithm (MA), one of the most successful and popular heuristic search methods. An important challenge for MAs is the trade-off between global and local ...


Determining Optimal Support Vector Machines for the Classification of Hyperspectral Images Based on the Genetic Algorithm

Hyperspectral remote sensing imagery, owing to its rich spectral information, provides an efficient tool for ground classification in complex geographical areas with similar classes. Given the robustness of Support Vector Machines (SVMs) in high-dimensional spaces, they are an efficient tool for the classification of hyperspectral imagery. However, there are two optimization issues which s...



Journal:
  • CoRR

Volume: abs/1604.08636

Pages: –

Publication date: 2016